

A.I. has mastered 'Gran Turismo' -- and one autonomous car designer is taking note

NPR Technology

The Gran Turismo Sophy A.I. does a lap of the course. An artificial intelligence program has beaten the world's best players in the popular PlayStation racing game Gran Turismo Sport, and in doing so may have contributed to the design of better autonomous vehicles in the real world, according to one expert. The latest development follows an interesting couple of decades of A.I. playing games, beginning with chess, when world champion Garry Kasparov lost to IBM's Deep Blue in a match in 1997.


Gran Turismo Sophy, A New AI By Sony - Pioneering Minds

#artificialintelligence

Gran Turismo Sophy is an artificial intelligence developed in-house at Sony, which claims it can race against the best Gran Turismo players in the world. It was trained using the game's engine and, after months of training, can score over 100 points. Sophy learns through a deep reinforcement learning system, running on Sony Interactive Entertainment's cloud gaming infrastructure. Sony calls it a different kind of AI from the likes of AlphaStar and OpenAI Five, which were developed for strategy games; Sophy instead has to learn how to drive a car and deal with simulated physics. While Sony AI's goal was to create artificial intelligence that could compete with the best Gran Turismo drivers, the team also saw value in creating an AI that would be enjoyable for those drivers to race against. Gran Turismo Sophy was designed so that it neither feels unfair nor appears outlandishly superhuman.
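To give a flavor of the reinforcement learning idea described above, here is a minimal toy sketch: an agent learns, by trial, error and reward, what speed to carry through each segment of a tiny one-dimensional "track". Everything here (the track, the actions, the reward scheme) is an illustrative assumption; Sony's actual system uses deep neural networks, a full physics simulation and massive cloud infrastructure.

```python
import random

# Toy racing track: each entry is the maximum safe speed for that segment.
# Exceeding it counts as a crash. Purely illustrative, not Sony's setup.
TRACK = [3, 3, 1, 3, 2, 3, 3, 1, 3]
ACTIONS = [1, 2, 3]  # speeds the agent may choose in any segment

def run_episode(q, eps):
    """Drive one lap, updating the value estimates as we go."""
    total = 0
    for seg, limit in enumerate(TRACK):
        if random.random() < eps:                      # explore
            a = random.choice(ACTIONS)
        else:                                          # exploit best known
            a = max(ACTIONS, key=lambda x: q[(seg, x)])
        reward = a if a <= limit else -10              # faster is better, crashing is not
        q[(seg, a)] += 0.1 * (reward - q[(seg, a)])    # nudge estimate toward reward
        total += reward
    return total

def train(episodes=2000, seed=0):
    random.seed(seed)
    q = {(s, a): 0.0 for s in range(len(TRACK)) for a in ACTIONS}
    for i in range(episodes):
        # Exploration rate decays as the agent gains experience.
        run_episode(q, eps=max(0.05, 1.0 - i / episodes))
    return q

q = train()
# The learned greedy policy: the chosen speed for each track segment.
policy = [max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(len(TRACK))]
print(policy)
```

After training, the greedy policy settles on the fastest safe speed for each segment, which is the essence of what a racing agent must learn, scaled down enormously: Sophy faces continuous controls, real-time physics and opponents rather than a nine-segment lookup table.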


Sony Announces Gran Turismo Sophy Artificial Intelligence Project

#artificialintelligence

Disclosure: Crunchyroll is part of Funimation Global Group, a joint venture between Sony Pictures Entertainment and Aniplex. Sony Interactive Entertainment, Gran Turismo developer Polyphony Digital and the newly established Sony AI division have jointly announced the development of an artificial intelligence agent that has learned how to play the Gran Turismo franchise, becoming a formidable opponent that gives the franchise's top sim racers a run for their money. The agent, known as Gran Turismo Sophy, was developed by Sony's AI division using Sony Interactive Entertainment's cloud gaming infrastructure. In close collaboration with Polyphony Digital, the team built a training platform and related infrastructure that let the agent learn from hundreds of Gran Turismo Sport players in two special in-game events called Race Together in 2021, which in turn allowed it to develop its own skills and surpass the abilities of the franchise's top sim racers. A new short documentary elaborating on the development of the project has been released and is embedded below. Two full-length races featuring Gran Turismo Sophy have also been uploaded to YouTube.


Sony's AI race car driver beat the world's best humans

#artificialintelligence

Sony has developed what it's calling a breakthrough artificial intelligence program for the Gran Turismo series of PlayStation racing games. The software, called Gran Turismo Sophy, is so sophisticated, Sony says, that it handily beat a group of the world's best virtual race car drivers in a test version of the 2017 game Gran Turismo Sport in October. "Outracing human drivers so skillfully in a head-to-head competition represents a landmark achievement for AI," Chris Gerdes, a Stanford professor specializing in autonomous driving, wrote in a Nature article published alongside Sony's research. Gerdes said this research could one day affect self-driving car development, according to Wired. "GT Sophy's success on the track suggests that neural networks might one day have a larger role in the software of automated vehicles than they do today," Gerdes wrote.


Sony's new AI beats humans in Gran Turismo racing game

The Japan Times

Sony Group Corp. said on Wednesday it had created an artificial intelligence agent called Gran Turismo Sophy (GT Sophy) that was able to beat the world's best drivers of the PlayStation racing simulation game Gran Turismo. To get GT Sophy ready for the game, different units of Sony brought together fundamental AI research, a hyper-realistic racing simulator, and infrastructure for massive-scale AI training, the company said in a statement. The AI first raced against four top Gran Turismo drivers in July, learned from that race and outperformed the human drivers in another race in October. "It took about 20 PlayStations running simultaneously for about 10 to 12 days to train GT Sophy to race from scratch to superhuman level," said Peter Wurman, director of Sony AI America and leader of the team that designed the AI. While AI has been used to defeat humans at chess, mahjong and go, Sony said the difficulty in mastering race car driving lies in the many decisions that must be made in real time.